Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization


Abstract

This paper proposes and justifies two globally convergent Newton-type methods to solve unconstrained and constrained problems of nonsmooth optimization by using tools of variational analysis and generalized differentiation. Both methods are coderivative-based and employ generalized Hessians (coderivatives of subgradient mappings) associated with objective functions, which are either of class $${{\mathcal {C}}}^{1,1}$$ or are represented in the form of convex composite optimization, where one of the terms may be extended-real-valued. The proposed algorithms are of two types. The first one extends the damped Newton method and requires positive-definiteness of the generalized Hessians for its well-posedness and efficient performance, while the other algorithm is of the regularized Newton type and remains well-defined when the generalized Hessians are merely positive-semidefinite. The obtained convergence rates for both methods are at least linear, but they become superlinear under the semismooth$$^*$$ property of subgradient mappings. Problems of convex composite optimization are investigated with and without the strong convexity assumption on the smooth parts of objective functions by implementing the machinery of forward–backward envelopes. Numerical experiments are conducted for Lasso problems and for box constrained quadratic programs, with performance comparisons between the new algorithms and some other first-order and second-order methods that are highly recognized in nonsmooth optimization.
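The regularized variant described above can be sketched on a smooth test objective, where the generalized Hessian reduces to the classical one: the Hessian is shifted by $$\mu I$$ so that the Newton system stays solvable under mere positive-semidefiniteness, and an Armijo backtracking line search globalizes the step. All names and parameter values below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def regularized_newton(f, grad, hess, x0, mu=1e-8, c=1e-4, beta=0.5,
                       tol=1e-10, max_iter=100):
    """Minimal sketch of a regularized Newton method with Armijo backtracking."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = hess(x)
        # regularized Newton system: (H + mu*I) d = -g, solvable for H PSD
        d = np.linalg.solve(H + mu * np.eye(len(x)), -g)
        t = 1.0
        while f(x + t * d) > f(x) + c * t * (g @ d):  # Armijo backtracking
            t *= beta
        x = x + t * d
    return x

# usage on a strongly convex quadratic f(x) = 0.5 x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = regularized_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                           lambda x: A @ x - b,
                           lambda x: A,
                           x0=np.zeros(2))
```

On this quadratic the step is essentially exact, so the iteration reaches the minimizer $$A^{-1}b$$ in a couple of iterations; the point of the sketch is only the shape of the regularized step and the line search.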


Similar articles

Globally Convergent Inexact Newton Methods

Inexact Newton methods for finding a zero of $$F:{\mathbb {R}}^n\rightarrow {\mathbb {R}}^n$$ are variations of Newton's method in which each step only approximately satisfies the linear Newton equation but still reduces the norm of the local linear model of F. Here, inexact Newton methods are formulated that incorporate features designed to improve convergence from arbitrary starting points. For each method, a basic global convergence ...
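The defining forcing condition can be sketched as follows: each step d only needs to satisfy ||F(x) + F'(x) d|| <= eta ||F(x)||, which is realized here by iterating a crude inner solver until the linear residual is small enough. The inner solver and test problem are illustrative assumptions, not the cited paper's algorithm.

```python
import numpy as np

def inexact_newton(F, J, x0, eta=0.1, tol=1e-8, max_iter=50):
    """Sketch of an inexact Newton method for F(x) = 0 with forcing term eta."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        nF = np.linalg.norm(Fx)
        if nF <= tol:
            break
        Jx = J(x)
        d = np.zeros_like(x)
        alpha = 1.0 / np.linalg.norm(Jx, 2) ** 2   # safe step for the inner solver
        # inner loop: approximately solve Jx d = -Fx until the forcing
        # condition ||Fx + Jx d|| <= eta * ||Fx|| holds
        for _ in range(500):
            r = Fx + Jx @ d
            if np.linalg.norm(r) <= eta * nF:
                break
            d -= alpha * Jx.T @ r                  # gradient step on 0.5*||Jx d + Fx||^2
        x = x + d
    return x

# usage: intersect the circle x0^2 + x1^2 = 2 with the line x0 = x1
F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 2.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
root = inexact_newton(F, J, x0=np.array([1.5, 0.5]))
```

With eta = 0.1 each outer step cuts ||F|| by roughly that factor, which is the linear-rate behavior the inexact-Newton theory predicts near a regular zero.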


Nonsmooth optimization via quasi-Newton methods

We investigate the behavior of quasi-Newton algorithms applied to minimize a nonsmooth function f, not necessarily convex. We introduce an inexact line search that generates a sequence of nested intervals containing a set of points of nonzero measure that satisfy the Armijo and Wolfe conditions if f is absolutely continuous along the line. Furthermore, the line search is guaranteed to terminat...
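A nested-interval line search of the kind described above can be sketched by bracketing: expand the trial step while the Wolfe condition fails, shrink it when Armijo fails, and bisect in between. This is a sketch in the spirit of the snippet, with illustrative constants, not the authors' exact procedure; gradients are taken wherever f happens to be differentiable.

```python
import numpy as np

def weak_wolfe_search(f, grad, x, d, c1=1e-4, c2=0.9, max_bisect=50):
    """Inexact Armijo-Wolfe line search by interval expansion/bisection."""
    a, b, t = 0.0, np.inf, 1.0
    f0 = f(x)
    g0d = grad(x) @ d            # directional derivative at t = 0 (negative for descent)
    for _ in range(max_bisect):
        if f(x + t * d) > f0 + c1 * t * g0d:    # Armijo fails: step too long
            b = t
        elif grad(x + t * d) @ d < c2 * g0d:    # weak Wolfe fails: step too short
            a = t
        else:
            return t                            # both conditions hold
        t = 2.0 * a if np.isinf(b) else 0.5 * (a + b)
    return t

# usage on the nonsmooth f(x) = |x| from x = 2 in the descent direction d = -1
f = lambda x: float(np.abs(x).sum())
grad = lambda x: np.sign(x)
x, d = np.array([2.0]), np.array([-1.0])
t = weak_wolfe_search(f, grad, x, d)
```

On |x| the search must step past the kink at 0 before the Wolfe condition can hold, which is exactly the situation the nested-interval construction is designed to handle.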


Optimization Based Globally Convergent Methods for the Nonlinear Complementarity Problem

The nonlinear complementarity problem has been used to study and formulate various equilibrium problems including the traffic equilibrium problem, the spatial equilibrium problem and the Nash equilibrium problem. To solve the nonlinear complementarity problem, various iterative methods such as projection methods, linearized methods and Newton methods have been proposed and their convergence resu...
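A basic projection method of the kind mentioned above can be sketched for the NCP — find x >= 0 with F(x) >= 0 and x·F(x) = 0 — as the fixed-point iteration x <- max(0, x - gamma F(x)). This is an illustrative sketch for a monotone F with a small step size, not one of the specific methods analyzed in the cited paper.

```python
import numpy as np

def projection_method(F, x0, gamma=0.2, max_iter=1000, tol=1e-10):
    """Fixed-point projection iteration for the nonlinear complementarity problem."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # project the forward step onto the nonnegative orthant
        x_new = np.maximum(0.0, x - gamma * F(x))
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

# usage on a linear complementarity problem F(x) = Mx + q with M positive definite;
# its solution is x = (0.5, 0), where x1 > 0 pairs with F1 = 0 and x2 = 0 with F2 > 0
M = np.array([[2.0, 1.0], [1.0, 2.0]])
q = np.array([-1.0, 1.0])
x_star = projection_method(lambda x: M @ x + q, x0=np.zeros(2))
```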


Sub-Sampled Newton Methods I: Globally Convergent Algorithms

Large scale optimization problems are ubiquitous in machine learning and data analysis and there is a plethora of algorithms for solving such problems. Many of these algorithms employ sub-sampling, as a way to either speed up the computations and/or to implicitly implement a form of statistical regularization. In this paper, we consider second-order iterative optimization algorithms, i.e., thos...
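The sub-sampling idea can be sketched for a ridge-regression objective: the gradient is computed exactly, while the Hessian is estimated from a random row sample each iteration. Function names, the sample size, and all parameter choices below are illustrative assumptions, not the cited paper's algorithms.

```python
import numpy as np

def subsampled_newton(A, b, lam=0.1, sample=100, max_iter=100, seed=0):
    """Sub-sampled Newton sketch for f(w) = (1/2n)||Aw - b||^2 + (lam/2)||w||^2."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        g = A.T @ (A @ w - b) / n + lam * w               # exact full gradient
        idx = rng.choice(n, size=sample, replace=False)   # random row sub-sample
        H = A[idx].T @ A[idx] / sample + lam * np.eye(d)  # sampled Hessian estimate
        w -= np.linalg.solve(H, g)                        # approximate Newton step
    return w

rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = rng.standard_normal(200)
w_hat = subsampled_newton(A, b)
```

Because the sampled Hessian concentrates around the true one, each step still contracts the error toward the exact ridge solution, at a fraction of the per-iteration Hessian cost on large n.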


Globally convergent Jacobian smoothing inexact Newton methods for NCP

A new smoothing algorithm for the solution of nonlinear complementarity problems (NCP) is introduced in this paper. It is based on a semismooth equation reformulation of the NCP by means of the Fischer–Burmeister function and its related smooth approximation. In each iteration the corresponding linear system is solved only approximately. Since inexact directions are not necessarily descent directions, a nonmonotone techniqu...
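The Fischer–Burmeister smoothing idea can be sketched as a plain smoothing Newton iteration: the NCP is recast as phi(x_i, F_i(x)) = 0 with phi(a, b) = sqrt(a^2 + b^2) - a - b, smoothed as phi_mu(a, b) = sqrt(a^2 + b^2 + 2 mu^2) - a - b, and mu is driven to zero across full Newton steps. This omits the inexact solves and nonmonotone safeguards of the cited algorithm; all details below are illustrative assumptions.

```python
import numpy as np

def fb_smoothing_newton(F, JF, x0, mu=1e-2, tol=1e-10, max_iter=100):
    """Smoothing Newton sketch for the NCP via the Fischer-Burmeister function."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        a, b = x, F(x)
        r = np.sqrt(a ** 2 + b ** 2 + 2.0 * mu ** 2)
        phi = r - a - b                                  # smoothed FB residual
        if np.linalg.norm(phi) <= tol:
            break
        Da = np.diag(a / r - 1.0)                        # d(phi)/da, diagonal
        Db = np.diag(b / r - 1.0)                        # d(phi)/db, diagonal
        x = x - np.linalg.solve(Da + Db @ JF(x), phi)    # full Newton step
        mu *= 0.5                                        # drive the smoothing to zero
    return x

# usage on the LCP F(x) = Mx + q, whose solution is x = (1, 0):
# x1 = 1 pairs with F1 = 0, and x2 = 0 pairs with F2 = 2 > 0
M = np.array([[3.0, 1.0], [1.0, 2.0]])
q = np.array([-3.0, 1.0])
sol = fb_smoothing_newton(lambda x: M @ x + q, lambda x: M, x0=np.ones(2))
```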



Journal

Journal title: Mathematical Programming

Year: 2023

ISSN: 0025-5610, 1436-4646

DOI: https://doi.org/10.1007/s10107-023-01980-2